Accelerating backpropagation through dynamic self-adaptation

Authors

  • Ralf Salomon
  • J. Leo van Hemmen
Abstract

Standard backpropagation and many procedures derived from it use the steepest-descent method to minimize a cost function. In this paper, we present a new genetic algorithm, dynamic self-adaptation, to accelerate steepest descent as it is used in iterative procedures. The underlying idea is to take the learning rate of the previous step, to increase and decrease it slightly, to evaluate the cost function for both new values of the learning rate, and to choose the one that gives the lower value of the cost function. In this way, the algorithm adapts itself locally to the cost function landscape. We present a convergence proof, estimate the convergence rate, and test the algorithm on several hard problems. As compared to standard backpropagation, the convergence rate can be improved by several orders of magnitude. Furthermore, dynamic self-adaptation can also be applied to several parameters simultaneously, such as the learning rate and momentum.
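The update rule described in the abstract can be summarised in a few lines of code. The sketch below is illustrative only: the function names, the scaling factor `zeta`, and the quadratic test problem are assumptions for demonstration, not values or code taken from the paper.

```python
import numpy as np

def dynamic_self_adaptation(cost, grad, w, eta=0.1, zeta=1.5, n_steps=200):
    """Minimal sketch of the dynamic self-adaptation idea described above.

    At every iteration the previous learning rate is both increased and
    decreased by a factor ``zeta``; the cost function is evaluated for the
    two candidate steps, and the learning rate that yields the lower cost
    is kept. ``zeta`` and the example problem below are illustrative
    choices, not values prescribed by the paper.
    """
    for _ in range(n_steps):
        g = grad(w)
        eta_up, eta_down = eta * zeta, eta / zeta
        w_up, w_down = w - eta_up * g, w - eta_down * g
        # Keep whichever candidate gives the lower cost; the learning
        # rate thereby adapts itself to the local cost landscape.
        if cost(w_up) < cost(w_down):
            w, eta = w_up, eta_up
        else:
            w, eta = w_down, eta_down
    return w, eta

# Usage on a simple ill-conditioned quadratic (illustrative only).
A = np.diag([1.0, 10.0])
cost = lambda w: 0.5 * w @ A @ w
grad = lambda w: A @ w
w_min, eta_final = dynamic_self_adaptation(cost, grad, np.array([1.0, 1.0]))
print(w_min, eta_final)
```

The same trial-and-select scheme could, as the abstract notes, be run for several parameters at once (e.g. learning rate and momentum), at the cost of extra cost-function evaluations per step.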


Similar articles


A Novel Fast Backpropagation Learning Algorithm Using Parallel Tangent and Heuristic Line Search

In gradient-based learning algorithms, the momentum term usually improves the convergence rate and reduces the zigzagging phenomenon. However, it sometimes causes the convergence rate to decrease. The Parallel Tangent (ParTan) gradient is used as a deflecting method to improve convergence. From the implementation point of view, it is as simple as momentum. In fact this method is...
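For context, a minimal sketch of the classical momentum update that this approach is positioned against; the parameter names and values (`eta`, `beta`) are illustrative assumptions, not taken from the paper, and the sketch does not implement the ParTan deflection itself.

```python
import numpy as np

def momentum_descent(grad, w, eta=0.01, beta=0.9, n_steps=500):
    """Gradient descent with a classical momentum term.

    The velocity v accumulates past gradients, which damps the
    zigzagging mentioned above; eta and beta are illustrative values.
    """
    v = np.zeros_like(w)
    for _ in range(n_steps):
        v = beta * v - eta * grad(w)  # deflect the step with accumulated history
        w = w + v
    return w

# Example on a simple quadratic bowl (illustrative only).
A = np.diag([1.0, 10.0])
w_final = momentum_descent(lambda w: A @ w, np.array([1.0, 1.0]))
print(w_final)
```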


Coprocessors for special neural networks KOKOS and KOBOLD

In this paper we present a system for accelerating special kinds of neural networks. It is a hardware-supported system consisting of different parts. A special-purpose neural coprocessor is connected to a personal computer (PC) by a special, asynchronous interface. Two different neural coprocessors are available: KOKOS, a coprocessor for Kohonen's self-organizing map, and KOBOLD, accelerating bac...


Stable Dynamic Parameter Adaptation

A stability criterion for dynamic parameter adaptation is given. In the case of the learning rate of backpropagation, a class of stable algorithms is presented and studied, including a convergence proof.


On-line Step Size Adaptation

Gradient-based methods are often used for optimization. They form the basis of several neural network training algorithms, including backpropagation. They are known to be slow, however. Several techniques exist for the acceleration of gradient-based optimization, but very few of them are applicable to stochastic or real-time optimization. This paper proposes a new step size adaptation technique...



Journal:
  • Neural Networks

Volume 9, Issue -

Pages -

Publication date: 1996